Distribution-Dependent Vapnik-Chervonenkis Bounds

Authors

  • Nicolas Vayatis
  • Robert Azencott
Abstract

Vapnik-Chervonenkis (VC) bounds play an important role in statistical learning theory, as they are the fundamental results explaining the generalization ability of learning machines. Substantial mathematical work has been devoted over the years to improving the VC rates of convergence of empirical means to their expectations. The result obtained by Talagrand in 1994 seems to provide more or less the final word on this issue as far as universal bounds are concerned. For fixed distributions, however, this bound can be outperformed in practice. We show, indeed, that it is possible to replace the 2ε² under the exponential of the deviation term by the corresponding Cramér transform, as shown by large deviations theorems. We then formulate rigorous distribution-sensitive VC bounds, and we explain why these theoretical results can lead to practical estimates of the effective VC dimension of learning structures.
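The core claim of the abstract — replacing the 2ε² in the exponent by the Cramér transform — can be made concrete for Bernoulli variables, where the Cramér transform is the binary Kullback-Leibler divergence. By Pinsker's inequality, KL(p+ε ‖ p) ≥ 2ε², so the large-deviations exponent is never worse than the universal one. A minimal numerical sketch (an illustration of the general principle, not the paper's own construction):

```python
import math

def kl_bernoulli(q, p):
    """KL(q || p) for Bernoulli parameters: the Cramér transform governing
    large deviations of a Bernoulli(p) empirical mean at level q."""
    return q * math.log(q / p) + (1 - q) * math.log((1 - q) / (1 - p))

def hoeffding_exponent(eps):
    """Universal (distribution-free) exponent appearing in VC-type bounds."""
    return 2 * eps ** 2

# Example: true error p = 0.1, deviation eps = 0.05, sample size n = 1000.
p, eps, n = 0.1, 0.05, 1000
cramer = kl_bernoulli(p + eps, p)   # distribution-dependent exponent
hoeff = hoeffding_exponent(eps)     # universal exponent

print(f"Cramér exponent: {cramer:.4f}  vs  universal 2*eps^2: {hoeff:.4f}")
print(f"exp(-n * I): {math.exp(-n * cramer):.3e}  vs  {math.exp(-n * hoeff):.3e}")
```

For small p, the KL exponent is markedly larger than 2ε², which is exactly why a distribution-sensitive bound can be much tighter than the universal one.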


Similar resources

Making Vapnik-Chervonenkis bounds accurate

This chapter shows how returning to the combinatorial nature of the Vapnik-Chervonenkis bounds provides simple ways to increase their accuracy, take into account properties of the data and of the learning algorithm, and provide empirically accurate estimates of the deviation between training error and testing error.
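As an illustration of the combinatorial objects involved (not this chapter's own construction), the Sauer-Shelah lemma bounds the growth function of a class of VC dimension d, and it is this quantity that enters classical VC deviation bounds. A minimal sketch in Python; the constants in the commented bound follow one common textbook form and vary across references:

```python
from math import comb, exp

def sauer_bound(n, d):
    """Sauer-Shelah upper bound on the growth function (shatter
    coefficient) S(n) of a concept class with VC dimension d."""
    return sum(comb(n, i) for i in range(d + 1))

def classical_vc_bound(n, d, eps):
    """One common textbook form of the VC deviation bound:
    P(sup |empirical - true| > eps) <= 4 * S(2n) * exp(-n * eps^2 / 8).
    Illustrative only; constants differ between references."""
    return 4 * sauer_bound(2 * n, d) * exp(-n * eps ** 2 / 8)

print(sauer_bound(10, 2))  # 1 + 10 + 45 = 56
print(classical_vc_bound(10000, 5, 0.1))
```

Because sauer_bound grows only polynomially in n for fixed d, the exponential deviation term eventually dominates, which is the combinatorial heart of VC-type generalization guarantees.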


Vapnik-chervonenkis Density in Some Theories without the Independence Property, I

We recast the problem of calculating Vapnik-Chervonenkis (VC) density into one of counting types, and thereby calculate bounds (often optimal) on the VC density for some weakly o-minimal, weakly quasi-o-minimal, and P-minimal theories.


Vapnik-Chervonenkis Density in Some Theories without the Independence Property, II

We study the Vapnik-Chervonenkis (VC) density of definable families in certain stable first-order theories. In particular we obtain uniform bounds on VC density of definable families in finite U-rank theories without the finite cover property, and we characterize those abelian groups for which there exist uniform bounds on the VC density of definable families.


Sign rank versus Vapnik-Chervonenkis dimension

This work studies the maximum possible sign rank of N × N sign matrices with a given Vapnik-Chervonenkis dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve on previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constr...


Improved Uniform Test Error Bounds

We derive distribution-free uniform test error bounds that improve on VC-type bounds for validation. We show how to use knowledge of test inputs to improve the bounds. The bounds are sharp, but they require intense computation. We introduce a method to trade sharpness for speed of computation. We also compute the bounds for several test cases. Keywords: machine learning, learning theory, generalization ...




Publication date: 1999